
Reviews: Low-Rank Regression with Tensor Responses

Neural Information Processing Systems

Strength:
-- The paper provides a theoretical analysis of approximation guarantees and a generalization bound for the class of tensor-valued regression functions.

Weakness:
-- A major drawback is that the novelty and contribution are rather limited. The key idea and the model of this paper are essentially equivalent to HOPLS in the following paper: [Zhao et al.]. HOPLS assumes that both the tensor input and the tensor output have low-rank structure, and establishes the link between them in a common latent space; a regression step then follows against the projected latent variables.
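As a rough illustration of the model class at issue, the following is a minimal reduced-rank regression sketch on a matricized (flattened) response. It is illustrative only, not the paper's or HOPLS's actual algorithm, and all names and dimensions are invented for the example:

```python
import numpy as np

# Minimal sketch of low-rank (reduced-rank) regression with a matricized
# tensor response; illustrative only, not the reviewed paper's method.
rng = np.random.default_rng(0)
n, p, q, r = 200, 20, 15, 3          # samples, predictors, response dim, rank

A = rng.standard_normal((p, r))
B = rng.standard_normal((r, q))
X = rng.standard_normal((n, p))
Y = X @ A @ B + 0.01 * rng.standard_normal((n, q))   # rank-r coefficient + noise

# Unconstrained least squares, then project the coefficient onto rank r.
C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)        # p x q
U, s, Vt = np.linalg.svd(C_ols, full_matrices=False)
C_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]             # best rank-r approximation

rel_err = np.linalg.norm(Y - X @ C_r) / np.linalg.norm(Y)
```

The low-rank constraint is what ties the input and output spaces through a small latent dimension, which is the structural similarity the review points out between this model and HOPLS.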


Partially Observed Dynamic Tensor Response Regression

Zhou, Jie, Sun, Will Wei, Zhang, Jingfei, Li, Lexin

arXiv.org Machine Learning

In modern data science, dynamic tensor data is prevalent in numerous applications. An important task is to characterize the relationship between such dynamic tensors and external covariates. However, the tensor data is often only partially observed, rendering many existing methods inapplicable. In this article, we develop a regression model with a partially observed dynamic tensor as the response and external covariates as the predictors. We introduce low-rank, sparsity and fusion structures on the regression coefficient tensor, and consider a loss function projected over the observed entries. We develop an efficient non-convex alternating updating algorithm, and derive the finite-sample error bound of the actual estimator from each step of our optimization algorithm. Unobserved entries in the tensor response pose serious challenges. As a result, our proposal differs considerably in terms of estimation algorithm, regularity conditions, and theoretical properties from existing tensor completion or tensor response regression solutions. We illustrate the efficacy of the proposed method using simulations and two real applications: a neuroimaging dementia study and a digital advertising study.
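The idea of a loss "projected over the observed entries" can be sketched with a binary observation mask on a matricized response. This is a hedged illustration of the general device only, not the authors' algorithm; `masked_loss` and the other names are hypothetical:

```python
import numpy as np

# Sketch of a squared-error loss restricted to observed response entries.
# Illustrative only; not the paper's estimator or code.
rng = np.random.default_rng(1)
n, p, q = 50, 5, 30
X = rng.standard_normal((n, p))
B = rng.standard_normal((p, q))
Y = X @ B                              # fully observed ground truth
mask = rng.random((n, q)) < 0.7        # ~70% of response entries observed

def masked_loss(B_hat):
    # Zero out residuals at unobserved entries, then average over observed ones.
    resid = (Y - X @ B_hat) * mask
    return (resid ** 2).sum() / mask.sum()
```

Only observed entries contribute to the objective, so unobserved entries never penalize a candidate coefficient; this is what renders standard fully-observed tensor regression machinery inapplicable and motivates the projected loss.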


Structured Point Cloud Data Analysis via Regularized Tensor Regression for Process Modeling and Optimization

Yan, Hao, Paynabar, Kamran, Pacella, Massimo

arXiv.org Machine Learning

Modern measurement technologies provide the means to measure high-density spatial and geometric data in three-dimensional (3D) coordinate systems, referred to as point clouds. Point cloud data analysis has broad applications: in advanced manufacturing and metrology for measuring dimensional accuracy and analyzing shape, in geographic information systems (GIS) for digital elevation modeling and terrain analysis, in computer graphics for shape reconstruction, and in medical imaging for volumetric measurement, to name a few. The role of point cloud data in manufacturing is now more important than ever, particularly in smart and additive manufacturing processes, where products with complex shapes and geometries are manufactured with the help of advanced technologies (Gibson et al., 2010). In these processes, the dimensional and geometric accuracy of manufactured parts is measured in the form of point clouds using modern sensing devices, including touch-probe coordinate measuring machines (CMM) and optical systems such as laser scanners. Modeling the relationship of the dimensional accuracy, encapsulated in point clouds, with process parameters and machine settings is vital for variation reduction and process optimization.


STORE: Sparse Tensor Response Regression and Neuroimaging Analysis

Sun, Will Wei, Li, Lexin

arXiv.org Machine Learning

Motivated by applications in neuroimaging analysis, we propose a new regression model, Sparse TensOr REsponse regression (STORE), with a tensor response and a vector predictor. STORE embeds two key sparse structures: element-wise sparsity and low-rankness. It can handle both a non-symmetric and a symmetric tensor response, and thus is applicable to both structural and functional neuroimaging data. We formulate the parameter estimation as a non-convex optimization problem, and develop an efficient alternating updating algorithm. We establish a non-asymptotic estimation error bound for the actual estimator obtained from the proposed algorithm. This error bound reveals an interesting interaction between the computational efficiency and the statistical rate of convergence. When the distribution of the error tensor is Gaussian, we further obtain a fast estimation error rate which allows the tensor dimension to grow exponentially with the sample size. We illustrate the efficacy of our model through intensive simulations and an analysis of the Autism spectrum disorder neuroimaging data.
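The two sparse structures STORE embeds can be illustrated in isolation: soft-thresholding produces element-wise sparsity, and a truncated SVD produces low-rankness. This is a hedged sketch of the two ingredients on a plain matrix, not the paper's estimator; `soft_threshold` is a hypothetical helper:

```python
import numpy as np

def soft_threshold(M, lam):
    # Element-wise sparsity: shrink entries toward zero, zeroing small ones.
    return np.sign(M) * np.maximum(np.abs(M) - lam, 0.0)

rng = np.random.default_rng(2)
M = rng.standard_normal((8, 8))

S = soft_threshold(M, 1.0)               # sparse: entries with |M| <= 1 become 0

U, s, Vt = np.linalg.svd(M)
L = U[:, :2] @ np.diag(s[:2]) @ Vt[:2]   # low-rank: keep only top-2 components
```

An alternating scheme of the kind the abstract describes would interleave updates like these over the factors of the coefficient tensor, which is where the reported interaction between computational efficiency and statistical rate arises.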